On Integral Probability Metrics, φ-Divergences and Binary Classification
Abstract
A class of distance measures on probabilities, the integral probability metrics (IPMs), is addressed: these include the Wasserstein distance, the Dudley metric, and the Maximum Mean Discrepancy. IPMs have thus far mostly been used in abstract settings, for instance as theoretical tools in mass transportation problems and in metrizing the weak topology on the set of all Borel probability measures defined on a metric space. Practical applications of IPMs are less common, with some exceptions in the kernel machines literature. The present work contributes a number of novel properties of IPMs, which should help make IPMs more widely used in practice, for instance in areas where φ-divergences are currently popular. First, to understand the relation between IPMs and φ-divergences, the necessary and sufficient conditions under which these classes intersect are derived: the total variation distance is shown to be the only non-trivial φ-divergence that is also an IPM. This shows that IPMs are essentially different from φ-divergences. Second, empirical estimates of several IPMs from finite i.i.d. samples are obtained, and their consistency and convergence rates are analyzed. These estimators are shown to be easily computable, with better rates of convergence than estimators of φ-divergences. Third, a novel interpretation of IPMs is provided by relating them to binary classification: the IPM between class-conditional distributions is shown to be the negative of the optimal risk associated with a binary classifier. In addition, the smoothness of an appropriate binary classifier is proved to be inversely related to the distance between the class-conditional distributions, measured in terms of an IPM.
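The easily computable empirical estimators mentioned in the abstract can be illustrated with the Maximum Mean Discrepancy: its biased estimate from two finite samples is a simple average of kernel evaluations. The sketch below is illustrative only; the Gaussian kernel choice, bandwidth, and function names are assumptions, not taken from the paper.

```python
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    """Gaussian RBF kernel matrix k(a, b) = exp(-||a - b||^2 / (2 sigma^2))."""
    sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq_dists / (2.0 * sigma**2))

def mmd_biased(X, Y, sigma=1.0):
    """Biased (V-statistic) estimate of the squared MMD between samples X, Y.

    MMD^2 = mean k(x, x') + mean k(y, y') - 2 mean k(x, y),
    with each mean taken over the corresponding sample pairs.
    """
    Kxx = gaussian_kernel(X, X, sigma)
    Kyy = gaussian_kernel(Y, Y, sigma)
    Kxy = gaussian_kernel(X, Y, sigma)
    return Kxx.mean() + Kyy.mean() - 2.0 * Kxy.mean()
```

As a sanity check, the estimate between two samples from the same distribution should be close to zero, while samples from well-separated distributions give a clearly larger value.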
Similar resources
On Integral Probability Metrics, φ-Divergences and Binary Classification
φ-divergences are a widely studied class of distance measures between probabilities. In this paper, a different class of distance measures on probabilities, called the integral probability metrics (IPMs), is considered. IPMs, such as the Wasserstein distance and the Dudley metric, have thus far only been used in a limited setting: as theoretical tools in mass transportation problems and in metriz...
A Note on Integral Probability Metrics and φ-divergences
We study some connections between integral probability metrics [21] of the form γ_F(P, Q) := sup_{f ∈ F} |∫ f dP − ∫ f dQ| ...
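A concrete instance of the γ_F above: taking F to be the 1-Lipschitz functions yields the Wasserstein-1 distance, which in one dimension with equal sample sizes has a closed-form empirical value. A minimal sketch (the function name is illustrative, not from either paper):

```python
import numpy as np

def empirical_w1(x, y):
    """Empirical Wasserstein-1 distance between two equal-size 1-D samples.

    With F the 1-Lipschitz functions, gamma_F(P, Q) is the Wasserstein-1
    distance; for 1-D empirical measures with the same number of atoms it
    equals the mean absolute difference of the sorted samples.
    """
    x = np.sort(np.asarray(x, dtype=float))
    y = np.sort(np.asarray(y, dtype=float))
    if len(x) != len(y):
        raise ValueError("samples must have equal size")
    return float(np.abs(x - y).mean())
```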
Data-Driven Stochastic Programming Using Phi-Divergences
Most classical stochastic programming assumes that the distributions of uncertain parameters are known, and these distributions are an input to the model. In many applications, however, the true distribution is unknown. An ambiguity set of distributions can be used in these cases to hedge against the distributional uncertainty. Phi-divergences (Kullback–Leibler divergence, χ²-distance, etc.) prov...
Endpoints of generalized φ-contractive multivalued mappings of integral type
Recently, some researchers have established results on the existence of endpoints for multivalued mappings. In particular, Mohammadi and Rezapour [Endpoints of Suzuki type quasi-contractive multifunctions, U.P.B. Sci. Bull., Series A, 2015] used the technique of α-ψ-contractive mappings, due to Samet et al. (2012), to give some results about endpoints of Suzuki type quasi-contractiv...
Information, Divergence and Risk for Binary Experiments
We unify f-divergences, Bregman divergences, surrogate loss bounds (regret bounds), proper scoring rules, matching losses, cost curves, ROC curves and information. We do this by systematically studying integral and variational representations of these objects, and in so doing identify their primitives, all of which are related to cost-sensitive binary classification. As well as clarifying relations...
Publication date: 2009